Markov chains on a measurable state space

A Markov chain on a measurable state space is a discrete-time homogeneous Markov chain whose state space is a general measurable space.
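As a minimal sketch of the usual formalisation (the notation p and (E, Σ) is chosen here for illustration and is not taken from the source), the dynamics of such a chain are described by a transition kernel

p : E × Σ → [0, 1],

where A ↦ p(x, A) is a probability measure on (E, Σ) for each x ∈ E, and x ↦ p(x, A) is a measurable function for each A ∈ Σ. A sequence (X_n) is then a time-homogeneous Markov chain with kernel p if

Pr(X_{n+1} ∈ A | X_0, …, X_n) = p(X_n, A) almost surely, for all n ≥ 0 and all A ∈ Σ.

When E is countable and Σ is its power set, the kernel p reduces to the familiar transition matrix of a discrete Markov chain.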
== History ==

The definition of Markov chains evolved during the 20th century. In 1953 the term Markov chain was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob〔Joseph L. Doob: ''Stochastic Processes''. New York: John Wiley & Sons, 1953.〕 or Chung.〔Kai L. Chung: ''Markov Chains with Stationary Transition Probabilities''. Second edition. Berlin: Springer-Verlag, 1974.〕 Since the late 20th century it has become more common to consider a Markov chain as a stochastic process with a discrete index set, living on a measurable state space.〔Sean Meyn and Richard L. Tweedie: ''Markov Chains and Stochastic Stability''. Second edition, 2009.〕〔Daniel Revuz: ''Markov Chains''. Second edition, 1984.〕〔Rick Durrett: ''Probability: Theory and Examples''. Fourth edition, 2005.〕